Computation of marginal likelihoods with data-dependent support for latent variables

Authors

  • Sarah E. Heaps
  • Richard J. Boys
  • Malcolm Farrow
Abstract

Several Monte Carlo methods have been proposed for computing marginal likelihoods in Bayesian analyses. Some of these involve sampling from a sequence of intermediate distributions between the prior and posterior. A difficulty arises if the support in the posterior distribution is a proper subset of that in the prior distribution. This can happen in problems involving latent variables whose support depends upon the data and can make some methods inefficient and others invalid. The correction required for models of this type is derived and its use is illustrated by finding the marginal likelihoods in two examples. One concerns a model for competing risks. The other involves a zero-inflated over-dispersed Poisson model for counts of centipedes, using latent Gaussian variables to capture spatial dependence.
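
The class of methods the abstract refers to — sampling from a sequence of intermediate distributions between prior and posterior — includes the power-posterior (thermodynamic integration) approach. The sketch below illustrates that idea on a toy conjugate model, y_i ~ N(theta, 1) with prior theta ~ N(0, 1), where the tempered posteriors can be sampled exactly and the true marginal likelihood is available in closed form as a check. The model, temperature schedule, and sample sizes are illustrative assumptions, not the paper's examples, and the sketch does not involve the data-dependent-support correction the paper derives.

```python
import numpy as np

# Hedged sketch of power-posterior (thermodynamic integration) estimation of a
# log marginal likelihood. Toy model (an assumption for illustration):
#   y_i ~ N(theta, 1),  theta ~ N(0, 1).
# Conjugacy lets us sample each tempered posterior exactly instead of via MCMC.

rng = np.random.default_rng(0)
n = 10
y = rng.normal(1.0, 1.0, size=n)

def log_lik(theta, y):
    """Log likelihood of y under N(theta, 1), vectorised over a theta array."""
    return (-0.5 * len(y) * np.log(2 * np.pi)
            - 0.5 * ((y - theta[:, None]) ** 2).sum(axis=1))

# Temperature ladder t in [0, 1]; cubic spacing concentrates points near the
# prior end, where the integrand typically changes fastest.
K = 30
ts = (np.arange(K + 1) / K) ** 3

# Estimate E_t[log L] under each power posterior p(theta) L(theta)^t. By
# conjugacy this tempered posterior is N(t*sum(y)/(1 + t*n), 1/(1 + t*n)).
means = []
for t in ts:
    prec = 1.0 + t * n
    mu = t * y.sum() / prec
    theta = rng.normal(mu, np.sqrt(1.0 / prec), size=20000)
    means.append(log_lik(theta, y).mean())
means = np.array(means)

# Trapezoidal rule over the ladder: log m(y) = integral_0^1 E_t[log L] dt.
log_ml_est = np.sum(np.diff(ts) * (means[1:] + means[:-1]) / 2)

# Closed form for this model: marginally y ~ N(0, I + 11'), so the estimate
# can be checked directly.
log_ml_exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(1 + n)
                - 0.5 * (np.sum(y ** 2) - y.sum() ** 2 / (1 + n)))
print(log_ml_est, log_ml_exact)
```

Here the support of theta under every tempered distribution is the whole real line; the paper's correction becomes necessary precisely when the posterior support is a data-dependent proper subset of the prior support, which this toy model deliberately avoids.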

Similar Resources

Fast computation of the deviance information criterion for latent variable models

The deviance information criterion (DIC) has been widely used for Bayesian model comparison. However, recent studies have cautioned against the use of the DIC for comparing latent variable models. In particular, the DIC calculated using the conditional likelihood (obtained by conditioning on the latent variables) is found to be inappropriate, whereas the DIC computed using the integrated likeli...

Conjugate and conditional conjugate Bayesian analysis of discrete graphical models of marginal independence

A conjugate and conditional conjugate Bayesian analysis is presented for bi-directed discrete graphical models, which are used to describe and estimate marginal associations between categorical variables. To achieve this, each bi-directed graph is re-expressed by a Markov equivalent, over the observed margin, directed acyclic graph (DAG). This DAG equivalent model is obtained using the same ver...

Learning Deep Latent Gaussian Models with Markov Chain Monte Carlo

Deep latent Gaussian models are powerful and popular probabilistic models of high-dimensional data. These models are almost always fit using variational expectation-maximization, an approximation to true maximum-marginal-likelihood estimation. In this paper, we propose a different approach: rather than use a variational approximation (which produces biased gradient signals), we use Markov chain M...

Multi-Conditional Learning for Joint Probability Models with Latent Variables

We introduce Multi-Conditional Learning, a framework for optimizing graphical models based not on joint likelihood, or on conditional likelihood, but based on a product of several marginal conditional likelihoods each relying on common sets of parameters from an underlying joint model and predicting different subsets of variables conditioned on other subsets. When applied to undirected models w...

Discriminative Mixtures of Sparse Latent Fields for Risk Management

We describe a simple and efficient approach to learning structures of sparse high-dimensional latent variable models. Standard algorithms either learn structures of specific predefined forms, or estimate sparse graphs in the data space ignoring the possibility of the latent variables. In contrast, our method learns rich dependencies and allows for latent variables that may confound the relation...

Journal:
  • Computational Statistics & Data Analysis

Volume 71, Issue: -

Pages: -

Publication date: 2014